Learning with Compressible Priors

Author

  • Volkan Cevher
Abstract

We describe a set of probability distributions, dubbed compressible priors, whose independent and identically distributed (iid) realizations result in p-compressible signals. A signal x ∈ ℝ^N is called p-compressible with magnitude R if its sorted coefficients exhibit a power-law decay as |x|_(i) ≲ R · i^(−d), where the decay rate d is equal to 1/p. p-compressible signals live close to K-sparse signals (K ≪ N) in the ℓ_r-norm (r > p) since their best K-sparse approximation error decreases as O(R · K^(1/r − 1/p)). We show that the membership of the generalized Pareto, Student’s t, log-normal, Fréchet, and log-logistic distributions in the set of compressible priors depends only on the distribution parameters and is independent of N. In contrast, we demonstrate that the membership of the generalized Gaussian distribution (GGD) depends both on the signal dimension and the GGD parameters: the expected decay rate of N-sample iid realizations from the GGD with shape parameter q is given by 1/[q log(N/q)]. As stylized examples, we show via experiments that the wavelet coefficients of natural images are 1.67-compressible whereas their pixel gradients are 0.95 log(N/0.95)-compressible, on average. We also leverage the connections between compressible priors and sparse signals to develop new iterative re-weighted sparse signal recovery algorithms that outperform standard ℓ_1-norm minimization. Finally, we describe how to learn the hyperparameters of compressible priors in underdetermined regression problems by exploiting the geometry of their order statistics during signal recovery.
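As a rough illustration (our own sketch, not code or an estimator from the paper), the following Python snippet sorts the magnitudes of iid samples and fits the decay rate d in |x|_(i) ≈ R · i^(−d). Samples from a Cauchy distribution (Student's t with one degree of freedom, hence roughly 1-compressible) should give a fitted rate that is stable in N, whereas Laplacian samples (a GGD with shape q = 1) should give a rate that shrinks with N, qualitatively tracking the predicted 1/[q log(N/q)]; the fit range and the least-squares estimator are arbitrary choices, so the GGD numbers match the prediction only loosely.

import numpy as np

rng = np.random.default_rng(0)

def fitted_decay_rate(x):
    # Estimate d in |x|_(i) ~ R * i^(-d) by least squares on the log-log
    # sorted magnitudes; the head-only fit range is an arbitrary choice.
    mags = np.sort(np.abs(x))[::-1]
    i = np.arange(1, len(mags) + 1)
    head = slice(9, max(100, len(mags) // 100))  # skip the noisiest extremes
    slope, _ = np.polyfit(np.log(i[head]), np.log(mags[head]), 1)
    return -slope  # a rate d suggests the sample is roughly (1/d)-compressible

for N in (1_000, 100_000):
    d_cauchy = fitted_decay_rate(rng.standard_t(df=1, size=N))  # p ~ 1, N-free
    d_laplace = fitted_decay_rate(rng.laplace(size=N))          # GGD with q = 1
    print(f"N={N:>6}: Cauchy d = {d_cauchy:.2f}, "
          f"Laplace d = {d_laplace:.2f} (prediction ~ {1 / np.log(N):.2f})")

The Cauchy rate should hover near 1 for both sample sizes, while the Laplacian rate drops as N grows, mirroring the abstract's claim that GGD compressibility is dimension-dependent.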


Similar Articles

Compressible Priors for High-Dimensional Statistics

We develop a principled way of identifying probability distributions whose independent and identically distributed (iid) realizations are compressible, i.e., can be approximated as sparse. We focus on the context of Gaussian random underdetermined linear regression (GULR) problems, where compressibility is known to ensure the success of estimators exploiting sparse regularization. We prove that...


Optimal Learning High-Order MRF Priors of Color Image

In this paper, we present an optimised learning algorithm for high-order Markov random fields (MRF) color image priors that capture the statistics of natural scenes and can be used for a variety of computer vision tasks. The proposed optimal learning algorithm is achieved by simplifying the estimation of partition function in the learning model. The parameters in MRF color image priors are lear...


Attitude toward information and learning under multiple priors

This paper studies learning under multiple priors by characterizing the decision maker’s attitude toward information. She is incredulous if she integrates new information with respect to only those measures that minimize the likelihood of the new information, and credulous if she uses the maximum likelihood procedure to update her priors. Both updating rules expose her to dynamic inconsistency...


Machine Learning with Dirichlet and Beta Process Priors: Theory and Applications

Machine Learning with Dirichlet and Beta Process Priors: Theory and Applications, by John Paisley, Department of Electrical & Computer Engineering, Duke University


Learning Approximately Objective Priors

Informative Bayesian priors are often difficult to elicit, and when this is the case, modelers usually turn to noninformative or objective priors. However, objective priors such as the Jeffreys and reference priors are not tractable to derive for many models of interest. We address this issue by proposing techniques for learning reference prior approximations: we select a parametric family and ...




Publication date: 2009